494 research outputs found

    A "metric" complexity for weakly chaotic systems

    We consider the number of Bowen sets that are necessary to cover a large-measure subset of the phase space. This introduces a complexity indicator characterizing different kinds of (weakly) chaotic dynamics. Since in many systems its value is given by a sort of local entropy, this indicator is quite simple to calculate. We give some examples of its calculation in nontrivial systems (e.g., interval exchanges and piecewise isometries) and a formula, similar to the Ruelle-Pesin one, relating the complexity indicator to initial-condition sensitivity indicators playing the role of positive Lyapunov exponents.
    Comment: 15 pages, no figures.
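    A plausible formalization of the covering quantity described above, in Katok-style notation of our own (the paper's exact definitions may differ):

```latex
% N(n, epsilon, delta): minimal number of Bowen balls
%   B_n(x, epsilon) = { y : d(T^k x, T^k y) < epsilon for all 0 <= k < n }
% needed to cover a set of measure at least 1 - delta:
\[
  N(n,\epsilon,\delta) \;=\; \min\Bigl\{\,\#\mathcal{C} \;:\;
    \mathcal{C}\ \text{a family of Bowen balls } B_n(x,\epsilon),\quad
    \mu\Bigl(\textstyle\bigcup_{B\in\mathcal{C}} B\Bigr) \ge 1-\delta \Bigr\}.
\]
% In strongly chaotic systems, log N grows linearly in n and its growth rate
% recovers the metric entropy (Katok's formula); the indicator discussed in the
% abstract tracks the possibly sub-exponential asymptotics of
% log N(n, epsilon, delta) in weakly chaotic systems.
```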

    Space-Time Complexity in Hamiltonian Dynamics

    New notions of the complexity function C(epsilon;t,s) and entropy function S(epsilon;t,s) are introduced to describe systems with nonzero or zero Lyapunov exponents, or systems that exhibit strongly intermittent behavior with "flights", trappings, weak mixing, etc. The important feature of the new notions is the first appearance of epsilon-separation of initially close trajectories. The complexity function is similar to the propagator p(t0,x0;t,x), with x replaced by the natural length s of trajectories, and its introduction does not assume space-time independence in the evolution of the system. Special stress is placed on the choice of variables: replacing t by eta=ln(t) and s by xi=ln(s) makes it possible to consider time-algebraic and space-algebraic complexity and some mixed cases. It is shown that for typical cases the entropy function S(epsilon;xi,eta) possesses invariants (alpha,beta) that describe the fractal dimensions of the space-time structures of trajectories. The invariants (alpha,beta) can be linked to the transport properties of the system, on one side, and to the Riemann invariants for simple waves, on the other. This analogy provides a new meaning for the transport exponent mu, which can be considered as the speed of a Riemann wave in the log-phase space of the log-space-time variables. Some other applications of the new notions are considered and numerical examples are presented.
    Comment: 27 pages, 6 figures.
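    A hedged illustration of the "speed in log-space-time" reading of the transport exponent, consistent with the abstract but not a reproduction of the paper's derivation:

```latex
% Anomalous-transport scaling for the trajectory length s(t):
\[
  \langle s^{2}\rangle \;\sim\; t^{\mu}.
\]
% In the log variables eta = ln(t), xi = ln(s), the typical front therefore
% moves linearly,
\[
  \xi \;\approx\; \tfrac{\mu}{2}\,\eta ,
\]
% so that, up to the factor of 2 coming from tracking s versus s^2, the
% transport exponent plays the role of a propagation speed in the (eta, xi)
% plane.
```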

    Recurrence and algorithmic information

    In this paper we initiate a somewhat detailed investigation of the relationships between quantitative recurrence indicators and the algorithmic complexity of orbits in weakly chaotic dynamical systems. We mainly focus on examples.
    Comment: 26 pages, no figures.

    Sequence alignment, mutual information, and dissimilarity measures for constructing phylogenies

    Existing sequence alignment algorithms use heuristic scoring schemes that cannot be used as objective distance metrics. Therefore one relies on measures like the p- or log-det distances, or makes explicit, and often simplistic, assumptions about sequence evolution. Information theory provides an alternative in the form of mutual information (MI), which is, in principle, an objective and model-independent similarity measure. MI can be estimated by concatenating and zipping sequences, thereby yielding the "normalized compression distance". So far this has produced promising results, but with uncontrolled errors. We describe a simple approach to obtain robust estimates of MI from global pairwise alignments. Using standard alignment algorithms, this gives, for animal mitochondrial DNA, estimates that are strikingly close to those obtained from the alignment-free methods mentioned above. Our main result uses algorithmic (Kolmogorov) information theory, but we show that similar results can also be obtained from Shannon theory. Because it is not additive, normalized compression distance is not an optimal metric for phylogenetics, but we propose a simple modification that overcomes the issue of additivity. We test several versions of our MI-based distance measures on a large number of randomly chosen quartets and demonstrate that they all perform better than traditional measures like the Kimura or log-det (resp. paralinear) distances. Even a simplified version based on single-letter Shannon entropies, which can easily be incorporated into existing software packages, gave superior results throughout the entire animal kingdom. We see the main virtue of our approach, however, in a more general light: for example, it can also help to judge the relative merits of different alignment algorithms by estimating the significance of specific alignments.
    Comment: 19 pages + 16 pages of supplementary material.
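    The normalized compression distance mentioned above has a standard definition (Cilibrasi-Vitanyi); below is a minimal Python sketch using zlib as a stand-in compressor. It illustrates only the concatenate-and-zip idea; the paper's alignment-based MI estimators and its additivity correction are not reproduced here.

```python
import zlib

def compressed_size(data: bytes) -> int:
    """Length in bytes of the zlib-compressed input (crude proxy for Kolmogorov complexity)."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance between two byte strings."""
    cx, cy, cxy = compressed_size(x), compressed_size(y), compressed_size(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy usage with synthetic DNA-like strings (illustrative only).
a = b"ACGTACGTACGTACGTGGCA" * 50
b = b"ACGTACGTTCGTACGAGGCA" * 50  # similar repeating unit to a
c = b"TTTTGGGGCCCCAAAATTGG" * 50  # structurally different unit
print("NCD(a,b) =", ncd(a, b))  # expected smaller: a and b share structure
print("NCD(a,c) =", ncd(a, c))  # expected larger: little shared structure
```

    Note that zlib's 32 KB compression window limits this naive estimate on long genomes, one source of the "uncontrolled errors" the abstract refers to for compression-based estimates.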

    Complexity for extended dynamical systems

    We consider dynamical systems for which the spatial extension plays an important role. For these systems, the notions of attractor, epsilon-entropy and topological entropy per unit time and volume have been introduced previously. In this paper we use the notion of Kolmogorov complexity to introduce, for extended dynamical systems, a notion of complexity per unit time and volume that plays the same role as the metric entropy does for classical dynamical systems. We introduce this notion as an almost-sure limit on orbits of the system. Moreover, we prove a kind of variational principle for this complexity.
    Comment: 29 pages.
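    A hedged sketch of the kind of quantity involved (notation ours; the paper's precise construction, in particular the order of limits and the almost-sure statement, may differ):

```latex
% Kolmogorov complexity of the epsilon-coarse-grained orbit segment observed in
% a space window Lambda over the time window [0, T], normalized by the
% space-time volume:
\[
  c(\epsilon) \;=\; \lim_{T\to\infty,\ \Lambda\nearrow\mathbb{R}^{d}}
    \frac{K_{\epsilon}\bigl(x|_{\Lambda\times[0,T]}\bigr)}{T\,|\Lambda|},
  \qquad
  c \;=\; \lim_{\epsilon\to 0} c(\epsilon),
\]
% intended to play, per unit time and volume, the role that the metric entropy
% plays for classical (finite-dimensional) dynamical systems.
```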

    Complexity Characterization in a Probabilistic Approach to Dynamical Systems Through Information Geometry and Inductive Inference

    Information-geometric techniques and inductive inference methods hold great promise for solving computational problems of interest in classical and quantum physics, especially with regard to the complexity characterization of dynamical systems in terms of their probabilistic description on curved statistical manifolds. In this article, we investigate the possibility of describing the macroscopic behavior of complex systems in terms of the underlying statistical structure of their microscopic degrees of freedom by means of statistical inductive inference and information geometry. We review the Maximum Relative Entropy (MrE) formalism and the theoretical structure of the information geometrodynamical approach to chaos (IGAC) on statistical manifolds. Special focus is devoted to the roles played by the sectional curvature, the Jacobi field intensity and the information geometrodynamical entropy (IGE). These quantities serve as powerful information-geometric complexity measures of information-constrained dynamics associated with arbitrary chaotic and regular systems defined on the statistical manifold. Finally, applications of such information-geometric techniques to several theoretical models are presented.
    Comment: 29 pages.
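    For orientation, the geodesic-deviation (Jacobi) equation on a statistical manifold equipped with the Fisher-Rao metric, whose solutions the curvature-based measures above monitor (standard differential geometry, not the paper's specific model):

```latex
% Jacobi field J along a geodesic theta(tau) with tangent \dot{theta}:
\[
  \frac{D^{2}J^{a}}{d\tau^{2}}
  \;+\; R^{a}{}_{bcd}\,\dot{\theta}^{b}\,J^{c}\,\dot{\theta}^{d} \;=\; 0,
\]
% where R is the Riemann curvature of the Fisher-Rao metric; negative sectional
% curvature drives exponential growth of |J|, the "Jacobi field intensity"
% referred to in the abstract.
```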

    M-GCAT: interactively and efficiently constructing large-scale multiple genome comparison frameworks in closely related species

    BACKGROUND: Due to recent advances in whole-genome shotgun sequencing and assembly technologies, the financial cost of decoding an organism's DNA has been drastically reduced, resulting in a recent explosion of genomic sequencing projects. This increase in related genomic data will allow for in-depth studies of evolution in closely related species through multiple whole-genome comparisons. RESULTS: To facilitate such comparisons, we present an interactive multiple genome comparison and alignment tool, M-GCAT, that can efficiently construct multiple genome comparison frameworks in closely related species. M-GCAT is able to compare and identify highly conserved regions in up to 20 closely related bacterial species in minutes on a standard computer, and in as many as 90 genomes (containing 75 cloned genomes from a set of 15 published enterobacterial genomes) in an hour. M-GCAT also incorporates a novel comparative genomics data visualization interface allowing the user to globally and locally examine and inspect the conserved regions and gene annotations. CONCLUSION: M-GCAT is an interactive comparative genomics tool well suited for quickly generating multiple genome comparison frameworks and alignments among closely related species. M-GCAT is freely available for download for academic and non-commercial use at:

    Entropy and Quantum Kolmogorov Complexity: A Quantum Brudno's Theorem

    In classical information theory, the entropy rate and the Kolmogorov complexity per symbol are related by a theorem of Brudno. In this paper, we prove a quantum version of this theorem, connecting the von Neumann entropy rate and two notions of quantum Kolmogorov complexity, both based on the shortest qubit descriptions of qubit strings that, run by a universal quantum Turing machine, reproduce them as outputs.
    Comment: 26 pages, no figures. Reference to publication added: published in Communications in Mathematical Physics (http://www.springerlink.com/content/1432-0916/).
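    For reference, the classical statement being generalized (standard form of Brudno's theorem; the paper's quantum version replaces h with the von Neumann entropy rate and K with quantum Kolmogorov complexity):

```latex
% For an ergodic dynamical system with metric entropy h, and for almost every
% point, the symbolic orbit x_1 x_2 ... (with respect to a generating
% partition) satisfies
\[
  \lim_{n\to\infty} \frac{K(x_{1}\dots x_{n})}{n} \;=\; h ,
\]
% where K denotes Kolmogorov complexity.
```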

    Complex temporal patterns in molecular dynamics: a direct measure of the phase-space exploration by the trajectory at macroscopic time scales

    Computer-simulated trajectories of bulk water molecules form complex spatiotemporal structures at the picosecond time scale. This intrinsic complexity, which underlies the formation of molecular structures at longer time scales, has been quantified using a measure of statistical complexity. The method estimates the information contained in the molecular trajectory by detecting and quantifying temporal patterns present in the simulated data (velocity time series). Two types of temporal patterns are found. The first, defined by the short-time correlations corresponding to the velocity autocorrelation decay times (≈0.1 ps), remains asymptotically stable for time intervals longer than several tens of nanoseconds. The second is caused by previously unknown longer-time correlations (found at time scales beyond nanoseconds) that lead to a value of statistical complexity which slowly increases with time. A direct measure, based on the notion of statistical complexity, is introduced that describes how the trajectory explores the phase space and is independent of the particular molecular signal used as the observed time series.
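    The abstract does not specify the statistical-complexity estimator used, so the sketch below is only a loose stand-in: it quantifies temporal patterning in a median-binarized synthetic "velocity" series via Lempel-Ziv (1976) phrase counting. The signal, the binarization, and the choice of complexity measure are all illustrative assumptions, not the authors' method.

```python
import numpy as np

def lz76_phrases(symbols: str) -> int:
    """Number of phrases in the Lempel-Ziv (1976) parsing of a symbol string."""
    i, n, phrases = 0, len(symbols), 0
    while i < n:
        length = 1
        # grow the current phrase while it can still be reproduced from earlier text
        while i + length <= n and symbols[i:i + length] in symbols[:i + length - 1]:
            length += 1
        phrases += 1
        i += length
    return phrases

def symbolize(v: np.ndarray) -> str:
    """Binarize a signal about its median (a crude, hypothetical coarse-graining)."""
    m = np.median(v)
    return "".join("1" if x > m else "0" for x in v)

# Synthetic stand-in for a velocity component: a structured (oscillatory) signal
# versus white noise of the same length.
rng = np.random.default_rng(0)
t = np.arange(5000)
structured = np.sin(0.05 * t) + 0.3 * rng.standard_normal(t.size)
noise = rng.standard_normal(t.size)

print(lz76_phrases(symbolize(structured)))  # fewer phrases: strong temporal pattern
print(lz76_phrases(symbolize(noise)))       # more phrases: little temporal pattern
```

    A lower phrase count indicates more temporal structure in the series; this captures the general pattern-detection idea, while the paper's measure additionally tracks how such structure evolves with the observation time.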